Hyperparameters for datamodule #3792
Conversation
We want to use the hyperparameter saving code in the DataModule, too.
A DataModule can now save its hyperparameters just like a LightningModule.
The function takes a dict or Namespace and adds the contained hparams to the existing ones. If an hparam already exists, an error is raised to avoid silently overwriting it.
To log and checkpoint the hparams of the DataModule together with the model, we add them to the model's hparams before training.
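The merge-with-collision-check behavior described above can be sketched in plain Python. This is a hypothetical illustration of the semantics, not the actual PR code; the function name `merge_hparams` is invented for this example:

```python
from argparse import Namespace

def merge_hparams(existing: dict, new) -> dict:
    """Sketch of the merge described above: accepts a dict or Namespace
    and adds its entries to the existing hparams, raising on key
    collisions instead of silently overwriting them."""
    incoming = vars(new) if isinstance(new, Namespace) else dict(new)
    collisions = existing.keys() & incoming.keys()
    if collisions:
        raise ValueError(f"hparams already contain keys: {sorted(collisions)}")
    merged = dict(existing)
    merged.update(incoming)
    return merged

# Adding the DataModule's hparams to the model's hparams before training:
model_hparams = {"lr": 1e-3}
dm_hparams = Namespace(batch_size=32)
print(merge_hparams(model_hparams, dm_hparams))
# {'lr': 0.001, 'batch_size': 32}
```

Passing a second `lr` key here would raise a `ValueError` rather than overwrite the model's value, which matches the no-overwrite rule stated in the description.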
This pull request is now in conflict... :(
Hello @tilman151! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2021-07-09 14:38:05 UTC
Codecov Report
@@            Coverage Diff            @@
##           master   #3792      +/-   ##
=========================================
- Coverage      93%     88%       -5%
=========================================
  Files         215     216        +1
  Lines       13988   14009       +21
=========================================
- Hits        12965   12329      -636
- Misses       1023    1680      +657
# Conflicts:
#   pytorch_lightning/core/lightning.py
#   tests/core/test_datamodules.py
#   tests/models/test_hparams.py
The function for adding hparams is now only available in the LightningModule. The error message is now specific enough without re-raising it in the training loop.
Great feature, very much looking forward to it! I tried it out and it works great!
I am convinced adding back the setter is the wrong decision. It will get us into the same trouble we had before we removed it. Please reconsider this change.
LGTM, minor comment
Co-authored-by: Ethan Harris <ewah1g13@soton.ac.uk>
lgtm
Should this be documented in the datamodule docs?
What does this PR do?
Fixes #3769
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃